NLq Theory: Checking and Imposing Stability of Recurrent Neural Networks for Nonlinear Modelling
Author
Abstract
It is known that many discrete-time recurrent neural networks, such as neural state space models, multilayer Hopfield networks and locally recurrent globally feedforward neural networks, can be represented as NLq systems. Sufficient conditions for global asymptotic stability and input/output stability of NLq systems are available, including three types of criteria: diagonal scaling, and criteria depending on diagonal dominance and condition number factors of certain matrices. In this paper, it is discussed how Narendra's dynamic backpropagation procedure, used for identifying recurrent neural networks from I/O measurements, can be modified with an NLq stability constraint in order to ensure globally asymptotically stable identified models. An example illustrates how system identification of an internally stable model, corrupted by process noise, may lead to unwanted limit cycle behaviour, and how this problem can be avoided by adding the stability constraint.
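For orientation, the NLq form referred to above can be sketched as follows. This follows the usual statement in the NLq literature; the notation (state p_k, exogenous input w_k, interconnection matrices V_i, B_i, W_i, D_i) is illustrative and may differ from the paper's own definitions.

```latex
% Sketch of an NL_q system in state space form: q layers of alternating
% linear operators (V_i, W_i) and diagonal nonlinear operators (Gamma_i,
% Lambda_i). Illustrative notation, not a verbatim reproduction of the paper.
\begin{aligned}
p_{k+1} &= \Gamma_1\bigl( V_1\, \Gamma_2\bigl( V_2 \cdots
           \Gamma_q( V_q\, p_k + B_q\, w_k ) \cdots + B_2\, w_k \bigr)
           + B_1\, w_k \bigr) \\
e_k     &= \Lambda_1\bigl( W_1\, \Lambda_2\bigl( W_2 \cdots
           \Lambda_q( W_q\, p_k + D_q\, w_k ) \cdots + D_2\, w_k \bigr)
           + D_1\, w_k \bigr)
\end{aligned}
% The scalar nonlinearities sigma(.) on the diagonals of Gamma_i and
% Lambda_i satisfy the sector condition [0,1]:
%   0 \le \sigma(z)\, z \le z^2 \quad \forall z,
% which holds e.g. for tanh and the saturation function.
```

The stability criteria mentioned in the abstract (diagonal scaling, diagonal dominance, condition number factors) are sufficient conditions stated in terms of the interconnection matrices of this form.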
Similar Articles
NLq Theory: A Unifying Framework for Analysis, Design and Applications of Complex Nonlinear Systems
NLq systems represent a large class of discrete-time nonlinear dynamical systems in state space form that contain a number of q layers with alternating linear and nonlinear operators that satisfy a sector condition. Many problems arising in system, circuit and control theory and recurrent neural networks can be transformed into such NLq systems. Sufficient conditions for global asymptotic stabili...
Robust stability of stochastic fuzzy impulsive recurrent neural networks with time-varying delays
In this paper, global robust stability of stochastic impulsive recurrent neural networks with time-varying delays which are represented by the Takagi-Sugeno (T-S) fuzzy models is considered. A novel Linear Matrix Inequality (LMI)-based stability criterion is obtained by using Lyapunov functional theory to guarantee the asymptotic stability of uncertain fuzzy stochastic impulsive recurrent neural...
Discrete Time Generalized Cellular Neural Networks within NLq Theory
Generalized Cellular Neural Networks (GCNNs) were recently introduced by Guzelis & Chua. Instead of one single CNN, a set of CNNs was considered, interconnected in a feedforward, cascade or feedback way. The framework was in continuous time, with sufficient conditions for global asymptotic and I/O stability, and the relation with classical nonlinear control theory such as the Lur'e problem was reve...
Multi-Step-Ahead Prediction of Stock Price Using a New Architecture of Neural Networks
Modelling and forecasting the stock market is a challenging task for economists and engineers, since it has a dynamic structure and nonlinear characteristics. This nonlinearity affects the efficiency of the price characteristics. Using an Artificial Neural Network (ANN) is a proper way to model this nonlinearity, and it has been used successfully in one-step-ahead and multi-step-ahead prediction of di...
Generalized Cellular Neural Networks Represented in the NLq Framework
The aim of this paper is to show that discrete-time Generalized Cellular Neural Networks, with feedforward, feedback or cascade interconnections between CNNs, can be represented as NLqs, a concept introduced in [12]. NLqs are nonlinear systems in state space form with the typical feature of having a number of q layers with alternating linear and nonlinear operators that satisfy a sector condi...